Inspired by a discussion at Product Camp, this process article, and a desire to dispel the fear of doing usability testing, this step-by-step guide provides answers on how to successfully do face-to-face usability testing.
This guide is for anyone who wants to know:
- Does the functionality, as implemented, solve the customer needs elegantly?
- If not, where are they getting stuck?
- How can I do usability without looking foolish?
What usability testing is not
Usability testing is not for uncovering unmet customer needs, although you’ll gain some ideas. It does not drive the development roadmap; it enriches current and future plans.
Usability does not foretell how many customers will embrace your solution; it gives a sense of how ready your solution is.
Usability testing is not about giving up control of the development schedule; you decide what to do next.
With usability testing, you’ll find things that are not good, but are good to know.
When should you do usability testing?
At Product Camp, there was quite the discussion about when to start. “It is never too soon,” some said. “Nonsense,” I say. Conduct usability testing when there’s time to implement changes, but after you’ve worked through common issues that impact users. It’s cheaper to learn from others.
How do you choose what to test?
Don’t test everything. Only test what you’re willing to change. Consider navigation, status, and actions. Ask yourself:
- Where was the dev team uncomfortable?
- Which implementations have significant compromises?
- What are the critical customer problems?
- Which features are unique?
Your biggest challenge should be making the list short enough for a usability script. If that’s not the case, either (a) you have a genius team, or (b) you’re deluding yourself.
How do you develop a usability script?
The script is a process for testing. The more effort that is invested in creating the script upfront, the more it will fade into the background during testing. Here are some tips for developing your own usability script:
- State the customer problem/need (not how to complete the task)
- Sequence tasks logically (e.g., create a file before saving it)
- Don’t befuddle the tester/customer (use their language)
- Use a large font and as few words as possible
- Number tasks so that everyone stays in sync
- Check that the script length (testing session duration) is under 45 minutes
- Print one task per page to prevent testers from reading ahead
Special Note: if you are considering filing patents, you’ll also need testers to sign a non-disclosure agreement (NDA).
Who do you need for usability testing?
Usability testers
I call this first group of people “testers.” They are testing, not being tested. You’re looking for honest, even unpleasant feedback from folks who represent your customer personas. No one on the development team counts. Not even the boss.
How do you find usability testers?
This phase is the one least under your control. Start early, and consider these tactics for finding usability testers:
- Ask existing customers
- Ask colleagues (e.g., if your target audience is parents, are there parents on other teams?)
- Work with sales and marketing
- Ask technical support/customer service
- Ask your friends, and friends of friends
- Add a sign-up form on your website
- When doing other research, ask if you may contact participants again
- Build a list using social media
- Work with a research professional
Special note about technical support and customer service teams: There is nothing typical about the user experience of this group. They have deep product knowledge, and the customers they talk to are rarely your average users. Feedback from this group is often valuable but skewed.
How many usability testers do you need?
Jakob Nielsen advocates doing small usability tests with around five testers. In my experience, about 10 are required for confidence in this type of rich, qualitative feedback. With 10 testers, you’ll have more data than you can shake a stick at.
Usability moderator
The moderator chairs the sessions and is the only one who speaks with testers.
A moderator can make or break your usability testing. Team members typically self-select for (and against) this role, which can also be outsourced.
What are the key characteristics of a good moderator?
A good moderator is someone who:
- Has experience speaking to customers
- Sets the stage for frank feedback
- Knows how and when to prod (e.g., “I see the mouse is hovering. What would you expect there?”)
- Does not lead the witness
How do you set the stage for frank feedback?
The moderator sets the tone: you’re testing the software, not the testers. Done well, testers won’t be concerned with observers and will provide frank feedback. Here are some of my favorite phrases:
- “Thanks for helping us test the website.”
- “Your criticism helps make the user experience better.”
- “We won’t be offended. Really!”
How do you avoid leading the witness?
At Product Camp, this question had half the room squirming. Testers will ask for assistance. Moderators will be tempted to help.
Practicing responses to requests for assistance helps. (If you can make them humorous, even better.) Try these phrases:
- “Hmmm, maybe we did a rubbish job. What do you see?” (Moderator looks at the screen.)
- “I could tell you, but then I’d have to be killed. There is no right or wrong answer. What do you think?”
Usability observers
Learning and capturing actionable information is the observer’s role. Each observer participates in at least three sessions. One tester may glide effortlessly through a flow that leaves another baffled; seeing multiple testers builds a rounded picture.
It’s helpful to have someone participate in all sessions. I find it more valuable to spend a few days observing than to rely on reading reports afterward. Alternatively, have one person get comfortable on the sofa and read every single report.
Where can you find observers?
It should be clear by now that this group is your development team. It’s a role that fits both interns and experts, the shy and the bubbly. Get out from behind the code/UI, and bring your users into sharper focus.
User experience design and product management teams are usually enthusiastic about this type of testing, so start there to drum up enthusiasm. Engineers often invent solutions as they observe; you’ll likely only have to persuade them the first time. Also, involving project sponsors gives them better insight into what their team is crafting.
How many observers do you need?
The answer to this one’s easy: Number of testers x 3 = Number of observers
How should the testing location be set up?
You’ll need screen sharing software which shows the tester’s desktop, full screen. (The image below suggests how to lay out the room.)
Some people choose to record the sessions. In years of doing usability testing, I have never seen a team use recordings.
How do you collect data and measure usability?
And what does “usable” mean, anyway?
You’re working to answer two main questions:
1. Does the implementation solve the customer problem/task? That is, did the tester complete the task?
2. What did it take for the customer to complete the task? This subjective question provides rich feedback. Typically, teams are surprised at the range of approaches testers take.
Recording usability observations: a free user testing observation spreadsheet template
Collating free-form text kills enthusiasm for many worthy projects. Your liberator? The spreadsheet.
Provide observers with a digital or paper spreadsheet. This is a team effort, which means that notes are for sharing. With document sharing options available through platforms such as Google Drive, the schedule, template, feedback, and analysis can reside in a single spreadsheet that is accessible to all team members.
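The JEM 9 template has its own layout, but as a rough sketch, an observation sheet needs at least columns like these (hypothetical names) to support the sorting, subtotaling, and filtering described in the next section:

```python
# Hypothetical column layout for an observation sheet; the real JEM 9
# template may differ, but fields like these are enough for the analysis below.
import csv

COLUMNS = [
    "observer",     # who took the notes
    "tester",       # which tester / session
    "task_number",  # matches the numbered tasks in the usability script
    "completed",    # "yes" or "no"
    "minutes",      # time the tester spent on the task
    "notes",        # what the tester said and did
]

# Write an empty template that each observer can copy and fill in.
with open("observation_template.csv", "w", newline="") as f:
    csv.writer(f).writerow(COLUMNS)
```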
Download the JEM 9 User Testing Observation Spreadsheet Template free of charge now.
How do you analyze the usability output?
If you are strict about using the user testing observation spreadsheet, this process is straightforward:
- Without changing the formatting, copy all rows from your observers’ feedback into one spreadsheet (rows 7 and below in the example shown)
- Sort by task
- To locate the worst features, add a subtotal for each task and sum the total minutes spent on it
- Filter out completed tasks (those marked “yes”)
- Decide on a course of action (next steps)
Start with the incomplete tasks that took the longest. You may decide to consider completed tasks later, but fry the biggest fish first. Now you’re back in regular development prioritization mode.
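If observers export their sheets as CSV files, the same sort / subtotal / filter steps can also be scripted. Here is a minimal sketch in Python (pandas), assuming the hypothetical column names from the template sketch above; spreadsheet sort, subtotal, and filter features accomplish the same thing.

```python
# Minimal sketch of the analysis steps, assuming one CSV export per observer
# with the hypothetical columns task_number, completed, and minutes.
import glob
import pandas as pd

# Copy all rows from every observer into one table.
observations = pd.concat(
    (pd.read_csv(path) for path in glob.glob("observations/*.csv")),
    ignore_index=True,
)

# Sort by task, then subtotal the minutes spent on each task.
by_task = observations.sort_values("task_number").groupby("task_number").agg(
    total_minutes=("minutes", "sum"),
    completions=("completed", lambda s: (s == "yes").sum()),
    attempts=("completed", "size"),
)

# Filter out fully completed tasks and surface the worst offenders:
# incomplete tasks that soaked up the most time.
incomplete = by_task[by_task["completions"] < by_task["attempts"]]
print(incomplete.sort_values("total_minutes", ascending=False))
```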
Usability testing summary checklist
- Usability testers
- Usability script for testers
- Observers
- User testing observation spreadsheet template
- Schedule of testers
- Software to test
- Screen sharing tool
- Minimum of two laptops or PCs
- Testing location
- Moderator’s script
- Moderator
Keep in mind that the framework laid out above has numerous moving parts, many of which your team may not be ready to undertake. If you’re going to commit to doing usability testing, however, you must be ready, and you must commit to doing it right. Otherwise, you risk gathering insights that are useless.
Performing usability testing is not for the faint of heart, but the insights gleaned from the endeavor can better connect your brand to its core audience in ways you likely never imagined.
What other tips can you share with the community to encourage folks to take advantage of usability testing?